Bi-Causal Recurrent Cascade Correlation

Authors

  • Alessio Micheli
  • Diego Sona
  • Alessandro Sperduti
Abstract

Recurrent neural networks cannot handle prediction tasks that do not satisfy the causality assumption. We propose to exploit bi-causality to extend the Recurrent Cascade Correlation model so that it can handle contextual prediction tasks. Preliminary results on artificial data show that the model preserves the prediction capability of Recurrent Cascade Correlation on strictly causal tasks, while extending that capability to prediction tasks that also depend on future inputs.
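The bi-causality idea in the abstract, namely using both past and future inputs when labeling each position of a sequence, can be illustrated with a generic bidirectional recurrent pass. This is only a sketch of the contextual setting, not the paper's architecture; all function and weight names below are made up for illustration:

```python
import numpy as np

def bicausal_states(xs, W_in, W_rec, W_in_b, W_rec_b):
    """Compute forward (causal) and backward (anti-causal) hidden
    states over a sequence, then concatenate them per time step.
    The representation at step t thus depends on both past and
    future inputs, which is the contextual setting a purely causal
    recurrent network cannot capture."""
    T, H = len(xs), W_rec.shape[0]
    h_f = np.zeros((T, H))          # forward states: past context
    h_b = np.zeros((T, H))          # backward states: future context
    prev = np.zeros(H)
    for t in range(T):              # left-to-right (causal) sweep
        prev = np.tanh(W_in @ xs[t] + W_rec @ prev)
        h_f[t] = prev
    nxt = np.zeros(H)
    for t in range(T - 1, -1, -1):  # right-to-left (anti-causal) sweep
        nxt = np.tanh(W_in_b @ xs[t] + W_rec_b @ nxt)
        h_b[t] = nxt
    return np.concatenate([h_f, h_b], axis=1)  # shape (T, 2H)
```

Perturbing a future input leaves the forward (causal) half of an earlier state unchanged but alters its backward half, which is exactly the extra information a contextual task requires.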

Related articles

Constructive learning of recurrent neural networks: limitations of recurrent cascade correlation and a simple solution

It is often difficult to predict the optimal neural network size for a particular application. Constructive or destructive methods that add or subtract neurons, layers, connections, etc. might offer a solution to this problem. We prove that one method, recurrent cascade correlation, due to its topology, has fundamental limitations in representation and thus in its learning capabilities. It cann...


Finite State Automata that Recurrent Cascade-Correlation Cannot Represent

This paper relates the computational power of Fahlman's Recurrent Cascade Correlation (RCC) architecture to that of finite state automata (FSA). While some recurrent networks are FSA-equivalent, RCC is not. The paper presents a theoretical analysis of the RCC architecture in the form of a proof describing a large class of FSA which cannot be realized by RCC.


E-RNN: Entangled Recurrent Neural Networks for Causal Prediction

We propose a novel architecture of recurrent neural networks (RNNs) for causal prediction, which we call Entangled RNN (E-RNN). To issue causal predictions, E-RNN can propagate the backward hidden states of a Bi-RNN through an additional forward hidden layer. Unlike a 2-layer RNN, all the hidden states of E-RNN depend on all the inputs seen so far. Furthermore, unlike a Bi-RNN, for causal predict...


The Recurrent Cascade-Correlation Architecture

Recurrent Cascade-Correlation (RCC) is a recurrent version of the Cascade-Correlation learning architecture of Fahlman and Lebiere [Fahlman, 1990]. RCC can learn from examples to map a sequence of inputs into a desired sequence of outputs. New hidden units with recurrent connections are added to the network one at a time, as they are needed during training. In effect, the network builds up a fi...

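The constructive scheme described above, where hidden units with recurrent connections are added one at a time and each frozen unit feeds all later ones, can be sketched structurally as follows. This is a minimal illustration of the cascaded, self-recurrent topology only, not Fahlman's full training procedure; candidate training by correlation maximization, and the output layer, are omitted, and all names are invented for this sketch:

```python
import numpy as np

class RCCSketch:
    """Structural sketch of a Recurrent Cascade-Correlation network:
    hidden units are appended one at a time; each new unit sees the
    raw inputs plus the outputs of all previously frozen units, and
    has a single self-recurrent weight."""

    def __init__(self, n_in):
        self.n_in = n_in
        self.units = []  # per unit: (w_in over [inputs + earlier units], w_self)

    def add_unit(self, w_in, w_self):
        # The cascade grows: unit i has n_in + i incoming feed-forward weights.
        assert len(w_in) == self.n_in + len(self.units), \
            "a new unit must see the inputs plus every earlier unit"
        self.units.append((np.asarray(w_in, float), float(w_self)))

    def run(self, xs):
        """Return activations of every hidden unit at every time step."""
        T = len(xs)
        acts = np.zeros((T, len(self.units)))
        state = np.zeros(len(self.units))      # previous-step outputs
        for t in range(T):
            for i, (w_in, w_self) in enumerate(self.units):
                # Cascade: unit i reads [x_t, units 0..i-1 at time t]
                # plus its own output from the previous time step.
                feed = np.concatenate([xs[t], acts[t, :i]])
                acts[t, i] = np.tanh(w_in @ feed + w_self * state[i])
            state = acts[t].copy()
        return acts
```

The self-recurrent weight is what lets each frozen unit carry a trace of earlier inputs forward, which is why the topology, once a unit is installed, behaves like a fixed feature extractor over sequences.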

Recent Advances in Continuous Speech Recognition Using the Time-Sliced Paradigm

We developed a method called Time-Slicing [1] for the analysis of the speech signal. It enables a neural network to recognize connected speech as it arrives, without having to fit the input signal into a fixed time format, or to label or segment it phoneme by phoneme. The neural network produces an immediate hypothesis of the recognized phoneme, and its size is small enough to run even on a PC. To i...




Publication year: 2000